How is it that it's Christian Americans who think America is the greatest country in the world?

What part of the Bible do you read where you go, "Yup. According to this inerrant word of God, the USA is the greatest country in the world"?

Is it the ridiculously expensive healthcare?
Is it the nuke stockpile big enough to destroy the earth multiple times over?
Maybe it's because we are the only country to ever use nuclear weapons on another country...and we did it TWICE.
Is it the 37,000 homeless veterans?
Is it the genocide of the natives who were living here first?
Was it chattel slavery?
Is it the largest prison population in the world?

I literally can't figure it out.

It's the obscene amounts of money and the fact that corporations pay little to no tax, isn't it?